Convergence Theory for Preconditioned Eigenvalue Solvers in a Nutshell
Authors
Abstract
Preconditioned iterative methods for numerical solution of large matrix eigenvalue problems are increasingly gaining importance in various application areas, ranging from material sciences to data mining. Some of them, e.g., those using multilevel preconditioning for elliptic differential operators or graph Laplacian eigenvalue problems, exhibit almost optimal complexity in practice, i.e., thei...
Similar papers
A geometric theory for preconditioned inverse iteration. III: A short and sharp convergence estimate for generalized eigenvalue problems
In two previous papers by Neymeyr: A geometric theory for preconditioned inverse iteration I: Extrema of the Rayleigh quotient, LAA 322: (1-3), 61-85, 2001, and A geometric theory for preconditioned inverse iteration II: Convergence estimates, LAA 322: (1-3), 87-104, 2001, a sharp, but cumbersome, convergence rate estimate was proved for a simple preconditioned eigensolver, which computes the s...
A Geometric Convergence Theory for the Preconditioned Steepest Descent Iteration
Preconditioned gradient iterations for very large eigenvalue problems are efficient solvers with growing popularity. However, only for the simplest preconditioned eigensolver, namely the preconditioned gradient iteration (or preconditioned inverse iteration) with fixed step size, sharp non-asymptotic convergence estimates are known. These estimates require a properly scaled preconditioner. In t...
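The fixed-step iteration mentioned in this abstract can be summarized in a few lines. The following is a minimal sketch, assuming a symmetric matrix, a unit step size, and a simple shifted-solve preconditioner chosen purely for illustration; none of these choices are taken from the papers listed here.

```python
# Minimal sketch of preconditioned inverse iteration (PINVIT) with fixed
# (unit) step size: x <- x - T (A x - lambda(x) x), followed by normalization.
# The test matrix and the shifted-solve preconditioner are illustrative
# assumptions, not taken from the cited papers.
import numpy as np

def rayleigh_quotient(A, x):
    """Rayleigh quotient lambda(x) = (x^T A x) / (x^T x)."""
    return (x @ (A @ x)) / (x @ x)

def pinvit(A, apply_T, x0, tol=1e-8, max_iter=500):
    """Approximate the smallest eigenpair of a symmetric matrix A.

    apply_T applies a preconditioner T that approximates the inverse of A.
    """
    x = x0 / np.linalg.norm(x0)
    for _ in range(max_iter):
        lam = rayleigh_quotient(A, x)
        r = A @ x - lam * x              # eigenvalue residual
        if np.linalg.norm(r) < tol:
            break
        x = x - apply_T(r)               # preconditioned correction, step size 1
        x /= np.linalg.norm(x)
    return rayleigh_quotient(A, x), x

if __name__ == "__main__":
    # 1D discrete Laplacian; preconditioner is an exact solve with A + shift*I.
    n = 100
    A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    M = A + 1e-3 * np.eye(n)
    rng = np.random.default_rng(0)
    lam, _ = pinvit(A, lambda v: np.linalg.solve(M, v), rng.standard_normal(n))
    print("PINVIT estimate:", lam, " exact smallest:", np.linalg.eigvalsh(A)[0])
```

The quality of apply_T as an approximation of the inverse of A is exactly what the non-asymptotic convergence estimates quantify; a poorly scaled preconditioner breaks the fixed-step variant, which is why the properly scaled assumption appears in the abstract above.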
A Geometric Theory for Preconditioned Inverse Iteration II: Convergence Estimates
The topic of this paper is a convergence analysis of preconditioned inverse iteration (PINVIT). A sharp estimate for the eigenvalue approximations is derived; the eigenvector approximations are controlled by an upper bound for the residual vector. The analysis is mainly based on extremal properties of various quantities which define the geometry of PINVIT.
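For orientation, the sharp PINVIT estimate referred to in this abstract is usually quoted in the following form, restated here from the Knyazev–Neymeyr line of work rather than copied from this page; $\lambda(x)$ denotes the Rayleigh quotient of the current iterate, $\lambda_i$ and $\lambda_{i+1}$ the eigenvalues enclosing it, and $\gamma \in [0,1)$ the preconditioner quality constant in $\|I - TA\|_A \le \gamma$:

$$
\frac{\lambda(x') - \lambda_i}{\lambda_{i+1} - \lambda(x')}
\;\le\;
\left(\gamma + (1-\gamma)\,\frac{\lambda_i}{\lambda_{i+1}}\right)^{2}
\frac{\lambda(x) - \lambda_i}{\lambda_{i+1} - \lambda(x)},
$$

where $x'$ is the PINVIT iterate produced from $x$. The estimate is sharp in the sense that the convergence factor cannot be improved without further assumptions on the preconditioner.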
Convergence analysis of a locally accelerated preconditioned steepest descent method for Hermitian-definite generalized eigenvalue problems
By extending the classical analysis techniques due to Samokish, Faddeev and Faddeeva, and Longsine and McCormick among others, we prove the convergence of preconditioned steepest descent with implicit deflation (PSD-id) method for solving Hermitian-definite generalized eigenvalue problems. Furthermore, we derive a nonasymptotic estimate of the rate of convergence of the PSD-id method. We show t...
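One step of the underlying preconditioned steepest descent iteration for the Hermitian-definite generalized problem A x = λ B x can be written as a Rayleigh–Ritz projection onto the two-dimensional space span{x, T r}. The sketch below shows only this basic step; the implicit deflation that distinguishes PSD-id is omitted, and all names are illustrative rather than taken from the paper.

```python
# Minimal sketch of one preconditioned steepest descent (PSD) step for
# A x = lambda B x, realized as a Rayleigh-Ritz projection onto span{x, T r}.
# The implicit deflation of the PSD-id method is intentionally omitted.
import numpy as np
from scipy.linalg import eigh

def psd_step(A, B, apply_T, x):
    """Return the Ritz pair from the trial space span{x, T r}."""
    lam = (x @ (A @ x)) / (x @ (B @ x))        # generalized Rayleigh quotient
    r = A @ x - lam * (B @ x)                  # residual of the current pair
    S = np.column_stack([x, apply_T(r)])       # two-dimensional trial basis
    theta, C = eigh(S.T @ A @ S, S.T @ B @ S)  # projected 2x2 eigenproblem
    y = S @ C[:, 0]                            # Ritz vector for smallest Ritz value
    return theta[0], y / np.linalg.norm(y)
```

Repeating this step, with converged eigenvectors deflated from the search, gives the kind of method whose convergence rate the abstract above estimates.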
Journal
Journal title: Foundations of Computational Mathematics
Year: 2015
ISSN: 1615-3375,1615-3383
DOI: 10.1007/s10208-015-9297-1